Learning, Bayesian Probability, Graphical Models, and Abduction

Author

  • David Poole
Abstract

In this chapter I review Bayesian statistics as used for induction and relate it to logic-based abduction. Much reasoning under uncertainty, including induction, is based on Bayes' rule. Bayes' rule is interesting precisely because it provides a mechanism for abduction. I review work of Buntine that argues that much of the work on Bayesian learning can be best viewed in terms of graphical models such as Bayesian networks, and review previous work of Poole that relates Bayesian networks to logic-based abduction. This lets us see how much of the work on induction can be viewed in terms of logic-based abduction. I then explore what this means for extending logic-based abduction to richer representations, such as learning decision trees with probabilities at the leaves. Much of this paper is tutorial in nature; both the probabilistic and logic-based notions of abduction and induction are introduced and motivated.
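As a brief illustration (added here, not part of the original abstract), the abductive reading of Bayes' rule is that the posterior weight of a hypothesis h as an explanation of evidence e combines how well h predicts e with how plausible h was a priori:

\[
P(h \mid e) \;=\; \frac{P(e \mid h)\,P(h)}{\sum_{h'} P(e \mid h')\,P(h')} \;\propto\; P(e \mid h)\,P(h).
\]

On this reading, the most probable explanation of e is simply the hypothesis that maximizes the product P(e | h) P(h).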


Similar articles

Learning, Bayesian Probability, Graphical Models, and Abduction

In this chapter I review Bayesian statistics as used for induction and relate it to logic-based abduction. Much reasoning under uncertainty, including induction, is based on Bayes' rule. Bayes' rule is interesting precisely because it provides a mechanism for abduction. I review work of Buntine that argues that much of the work on Bayesian learning can be best viewed in terms of graphical mode...


Statistical abduction with tabulation

We propose statistical abduction as a first-order logical framework for representing, inferring and learning probabilistic knowledge. It semantically integrates logical abduction with a parameterized distribution over abducibles. We show that statistical abduction combined with tabulated search provides an efficient algorithm for probability computation, a Viterbi-like algorithm for finding the most ...
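As a rough sketch of the general idea only (the abducible names, probabilities and candidate explanations below are invented for illustration, and this is not the paper's tabulated algorithm), statistical abduction attaches probabilities to abducibles and scores each candidate explanation by the product of the probabilities of its abducibles; a Viterbi-like step then takes the maximum instead of the sum:

```python
# Minimal sketch: statistical abduction with independent abducibles.
# Each candidate explanation is a set of abducible atoms; its probability
# is the product of its abducibles' probabilities. All names and numbers
# here are illustrative assumptions.
from math import prod

# Hypothetical parameterized distribution over abducibles.
abducible_prob = {
    "has_flu": 0.10,
    "has_cold": 0.30,
    "season_winter": 0.50,
}

# Hypothetical candidate explanations for an observation such as "fever".
explanations = [
    frozenset({"has_flu"}),
    frozenset({"has_cold", "season_winter"}),
]

def explanation_prob(expl):
    """Probability of an explanation, assuming abducibles are independent."""
    return prod(abducible_prob[a] for a in expl)

# Probability computation: sum over explanations (assumed mutually exclusive).
p_obs = sum(explanation_prob(e) for e in explanations)

# Viterbi-like step: pick the single most probable explanation.
best = max(explanations, key=explanation_prob)

print(f"P(observation) ~ {p_obs:.3f}")
print(f"most probable explanation: {set(best)} (p = {explanation_prob(best):.3f})")
```

The tabulation emphasized in the paper is about sharing sub-computations between explanations; the sketch above ignores that and just enumerates explanations directly.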


Learning Bayesian Network Structure using Markov Blanket in K2 Algorithm

A Bayesian network is a graphical model that represents a set of random variables and their causal relationships via a Directed Acyclic Graph (DAG). There are basically two methods used for learning a Bayesian network: parameter learning and structure learning. One of the most effective structure-learning methods is the K2 algorithm. Because the performance of the K2 algorithm depends on node...
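The snippet below is a small, assumption-heavy sketch of the K2 idea only (greedy parent selection under a fixed node ordering, scored with the log Cooper-Herskovits metric); the variable names and toy data are invented, and it is not the implementation from the cited paper:

```python
# K2-style greedy structure search over discrete data (illustrative sketch).
from itertools import product
from math import lgamma

def k2_log_score(data, child, parents, domains):
    """Log Cooper-Herskovits score of `child` for a candidate parent set."""
    r = len(domains[child])
    score = 0.0
    for pa_vals in product(*(domains[p] for p in parents)):
        rows = [row for row in data
                if all(row[p] == v for p, v in zip(parents, pa_vals))]
        n_ij = len(rows)
        score += lgamma(r) - lgamma(n_ij + r)          # (r-1)! / (N_ij + r - 1)!
        for val in domains[child]:
            n_ijk = sum(1 for row in rows if row[child] == val)
            score += lgamma(n_ijk + 1)                 # N_ijk!
    return score

def k2_search(data, ordering, domains, max_parents=2):
    """Greedy K2-style search: returns a dict mapping node -> parent list."""
    parents = {}
    for i, node in enumerate(ordering):
        chosen = []
        best = k2_log_score(data, node, chosen, domains)
        improved = True
        while improved and len(chosen) < max_parents:
            improved = False
            candidates = [p for p in ordering[:i] if p not in chosen]
            scored = [(k2_log_score(data, node, chosen + [p], domains), p)
                      for p in candidates]
            if scored:
                new_best, best_p = max(scored)
                if new_best > best:      # keep the single best-improving parent
                    chosen.append(best_p)
                    best = new_best
                    improved = True
        parents[node] = chosen
    return parents

# Tiny run on hypothetical binary data.
domains = {"rain": [0, 1], "sprinkler": [0, 1], "wet_grass": [0, 1]}
data = [{"rain": 1, "sprinkler": 0, "wet_grass": 1},
        {"rain": 0, "sprinkler": 1, "wet_grass": 1},
        {"rain": 0, "sprinkler": 0, "wet_grass": 0},
        {"rain": 1, "sprinkler": 0, "wet_grass": 1}]
print(k2_search(data, ["rain", "sprinkler", "wet_grass"], domains))
```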


A Viterbi-like algorithm and EM learning for statistical abduction

We propose statistical abduction as a first-order logical framework for representing and learning probabilistic knowledge. It combines logical abduction with a parameterized distribution over abducibles. We show that probability computation, a Viterbi-like algorithm and EM learning for statistical abduction achieve the same efficiency as specialized algorithms for HMMs (hidden Markov models), PCFGs (pr...


Bayesian inference for statistical abduction using Markov chain Monte Carlo

Abduction is one of the basic logical inferences (deduction, induction and abduction) and derives the best explanations for our observation. Statistical abduction attempts to define a probability distribution over explanations and to evaluate them by their probabilities. The framework of statistical abduction is general since many well-known probabilistic models, i.e., BNs, HMMs and PCFGs, are ...
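As a loose illustration of the MCMC angle (a toy Metropolis sampler over a handful of hypothetical explanations with made-up unnormalized weights, not the authors' method), sampling visits explanations roughly in proportion to their probability without ever computing the normalizing constant:

```python
# Toy Metropolis sampler over a finite set of candidate explanations.
import random

weights = {"e1": 0.10, "e2": 0.15, "e3": 0.02}   # hypothetical unnormalized scores
states = list(weights)

random.seed(0)
current = random.choice(states)
counts = {s: 0 for s in states}
for _ in range(20000):
    proposal = random.choice(states)              # symmetric uniform proposal
    if random.random() < min(1.0, weights[proposal] / weights[current]):
        current = proposal                        # accept the move
    counts[current] += 1

total = sum(counts.values())
print({s: round(c / total, 3) for s, c in counts.items()})
# The empirical frequencies approximate weights normalized to sum to one.
```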




Publication year: 1997